Tiiny AI has introduced Pocket Lab, a device that could shift the balance in artificial intelligence toward local hardware. Marketed as the world’s smallest AI supercomputer, it has earned a place in the Guinness World Records for its ability to run models in the 100-billion-parameter range locally. The American startup says the mini-computer can perform advanced inference without cloud connectivity, external servers, or a discrete graphics card.
Say goodbye to the cloud: Run models with 120 billion parameters in your pocket
The device’s most striking feature is its ability to run massive language models with between 10 billion and 120 billion parameters completely offline. The company emphasizes that data privacy is protected and network latency is eliminated because all operations are performed within the device. Tiiny AI CEO Samar Bhoj argues that intelligence should belong to humans, not data centers, and that this device will make AI more personal and accessible.
On the technical side, the device promises high performance while remaining within the 65W power consumption limit. At the heart of the system is a 12-core ARMv9.2 processor and a dedicated AI module providing approximately 190 TOPS of processing power. This is accompanied by 80GB of LPDDR5X memory and a 1TB SSD. Physically resembling a large external hard drive, the device efficiently distributes workloads using software technologies like TurboSparse and PowerInfer, eliminating the need for expensive accelerators.
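A quick back-of-envelope check makes the 80GB figure plausible. The sketch below is not based on Tiiny’s published specs beyond the memory size and parameter counts: it assumes 4-bit weight quantization (0.5 bytes per parameter) and roughly 20% overhead for the KV cache and runtime, both common but hypothetical figures here.

```python
def model_footprint_gb(params_billion: float,
                       bytes_per_param: float = 0.5,
                       overhead: float = 0.20) -> float:
    """Rough memory footprint in GB for a quantized model.

    Assumptions (not from Tiiny's spec sheet): 4-bit quantized
    weights and ~20% extra for KV cache and runtime buffers.
    """
    weights_gb = params_billion * 1e9 * bytes_per_param / 1e9
    return weights_gb * (1 + overhead)

# Check the 10B-120B range the company claims against 80 GB of RAM.
for size in (10, 70, 120):
    footprint = model_footprint_gb(size)
    verdict = "fits" if footprint <= 80 else "does not fit"
    print(f"{size}B params -> ~{footprint:.0f} GB ({verdict} in 80 GB)")
```

Under these assumptions a 120B-parameter model needs roughly 72GB, which squeezes under the 80GB ceiling; at 8-bit precision it would not, which suggests aggressive quantization is doing much of the work in claims like this.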
There’s also a confusing detail in the company’s statements. Tiiny mentions offering “OTA hardware upgrades” for the device, but since physical hardware cannot be updated remotely, this is likely either a marketing misstep or a mislabeled software update. Independent test results haven’t been published yet, but the device’s privacy and low-cost advantages are noteworthy.
The promise of server-class performance from such a small device has generated excitement in the tech world, but its real-world performance has yet to be verified. What are your thoughts? Will we be running massive AI models on small devices in our homes in the future?